Web Survey Bibliography
Title Investigating Response Quality in Mobile and Desktop Surveys: A Comparison of Radio Buttons, Visual Analogue Scales and Slider Scales
Author Toepoel, V.; Funke, F.
Year 2014
Access date 16.08.2016
Presentation PDF (184 kB)
Abstract
Mobile devices have smaller displays, touch screens and different methods of navigation compared to desktop computers. This may limit the amount of information that can be placed on a mobile phone screen and it can also affect how a survey is comprehended and completed.
The most traditional rating scales in Web surveys are made from radio buttons. Radio buttons require quite a lot of space, so only a limited number of response options can be presented simultaneously. Otherwise, respondents have to scroll to see all options, which may bias ratings.
Visual Analogue Scales (VAS) are operated by point and click: respondents move the mouse arrow to any position on the line and, after clicking the mouse button, a marker appears. In contrast, slider scales have a handle visible directly on load of the Web page, and ratings are done by drag and drop. Both scales can be implemented as either discrete or continuous rating scales. A continuous implementation is especially valuable if respondents use mobile devices such as smartphones, where an efficient use of space is required.
A comparison is needed of radio buttons, VAS, and slider bars to see how they affect usability and data quality on mobile phones compared to regular desktop completion. Finger navigation on mobile phones is less precise than mouse navigation on desktops. This could result in selecting the wrong (not intended) answer option with radio buttons; slider bars or VAS might be more efficient in selecting the intended response option. We look at response quality indicators, paradata, evaluation of the questionnaire, as well as personal characteristics. The usability of question formats is conjectured to be related to the number of scale points. We use an experimental design crossing question format with 5-, 7-, 11-point and continuous scales. Data are collected in a probability-based panel in the Netherlands.
Access/Direct link Conference Homepage (abstract) / (full text)
Year of publication 2014
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography - 2014 (234)
- Detecting Insufficient Effort Responding with an Infrequency Scale: Evaluating Validity and Participant...; 2016; Huang, J. L.; Bowling, N. A.; Liu, Me.; Li, Yu.
- Evaluating Three Approaches to Statistically Adjust for Mode Effects; 2016; Kolenikov, S.; Kennedy, C.
- An Examination of Opposing Responses on Duplicated Multi-Mode Survey Responses; 2016; Djangali, A.
- Computer-assisted and online data collection in general population surveys; 2016; Skarupova, K.
- Usability of the ACS Internet Instrument on Mobile Devices; 2015; Horwitz, R.
- Explorations in Non - Probability Sampling Using the Web; 2015; Brick, J. M.
- On Bias Adjustments for Web Surveys; 2015; Fan, L.; Lou, W.; Landsman, V.
- Are they willing to use the web? First results of a possible switch from PAPI to CAPI/CAWI in an establishment...; 2015; Ellguth, P.; Kohaut, S.
- Web panel surveys – a challenge for official statistics; 2015; Svensson, J.
- Estimation with Non-probability Surveys and the Question of External Validity; 2015; Dever, J. A.; Valliant, R. L.
- Measurement Properties of Web Surveys; 2015; Tourangeau, R.
- Improving Response to Household Surveys Using Mail Contact to Request Responses over the Internet: Results...; 2015; Dillman, D. A.
- The quality of data collected using online panels: a decade of research ; 2015; Callegaro, M.
- Sub-optimal Respondent Behavior and Data Quality in Online Surveys; 2015; Thomas, R. K.
- Methodology of the RAND Mid-Term 2014 Election Panel; 2015; Carman, K. G; Pollack, S.
- Designing Bonsai Surveys: The small but perfectly formed survey experience to meet the needs of the...; 2015; Puleston, J.
- Suggestions for international research using electronic surveys; 2015; e Silva, S. C.; Duarte, P.
- Recruiting Respondents for a Mobile Phone Panel: The Impact of Recruitment Question Wording on Cooperation...; 2015; Busse, B.; Fuchs, M.
- The effect of multiple reminders on response patterns in a Danish health survey; 2015; Christensen, A. I.; Ekholm, O.; Kristensen, P. L.; Larsen, F. B.; Vinding, A. L.; Gluemer, C.; Juel,...
- The quality of responses to grid questions as used in Web questionnaires (compared with paper questionnaires...; 2015; Dominguez, J. A.; de Rada, V. D.
- Identifying predictors of survey mode preference; 2015; Millar, M. M.; Olson, K.; Smyth, J. D.
- The Impact of Mixing Modes on Reliability in Longitudinal Studies; 2014; Cernat, A.
- Growing Beyond the Phone Tree; 2014; Hayzlett, J.
- A Comparison of Different Online Sampling Approaches for Generating National Samples; 2014; Heen, M. S. J., Lieberman, J. D., Miethe, T. D.
- Does Sequence Matter in Multimode Surveys: Results from an Experiment; 2014; Wagner, J., Arrieta, J., Guyer, H., Ofstedal, M. B.
- The Use of Cognitive Interviewing Methods to Evaluate Mode Effects in Survey Questions; 2014; Gray, M., Blake, M., Campanelli, P.
- A Mixed Methods Approach to Network Data Collection; 2014; Rice, E., Holloway, I. W., Barman-Adhikari, A., Fuentes, D., Brown, C. H., Palinkas, L. A.
- Influential Factors on Survey Outcomes: Length of Survey, Device Selection and External Elements; 2014; Ribeiro, E.
- The Effect of Mobile Web Survey Design on Screen Orientation Manipulation; 2014; Young, R.H.; Crawford, S. D.; Couper, M. P.; Nelson, T. F.
- Investigating Response Quality in Mobile and Desktop Surveys: A Comparison of Radio Buttons, Visual...; 2014; Toepoel, V.; Funke, F.
- Do online access panels really need to allow and adapt surveys to mobile devices? ; 2014; Revilla, M.; Toninelli, D.; Ochoa, C.; Loewe, G.
- Why you need to make your surveys mobile friendly NOW; 2014; Lorch, J.; Mitchell, N.
- Assessing the Impact Device Choice Has on Web Survey Data Collection ; 2014; Hupp, A.; Schroeder, H. M.; Piskorowski, A.D.
- Understanding Mobility: Consent and Capture of Geolocation Data in Web Surveys; 2014; Crawford, S. D.; McClain, C.; Young, R.H.; Nelson, T. F.
- Swipe, Snap & Chat: Mobile Survey Data Collection Using Touch Question Types and Mobile OS Features ; 2014; Buskirk, T. D.; Michaud, J.; Saunders, T.
- Statistical Approaches to Analyze Self-Reported Susceptibility to Driver Distraction; 2014; Chen, H-Y. W.; Donmez, B.; Ko, Y-D.
- Using Web Panels for Official Statistics; 2014; Bethlehem, J.
- The problem of non-response in population surveys on the topic of HIV and sexuality: a comparative study...; 2014; Wallander, L.; Mannheimer, L. N.; Oestergren, P. O.; Plantin, L.; Tikkanen, R. H.
- Does the Length of Fielding Period Matter? Examining Response Scores of Early Versus Late Responders; 2014; Dyer Yount, N.; Lewis, T.; Lee, K.; Sigman, R.
- FocusVision 2014 Annual MR Technology Report; 2014; Macer, T., Wilson, S.
- When it comes to mobile respondent experience and data quality, survey design matters; 2014; Mitchell, N.
- The Changing Landscape of Technology and its Effect on Online Survey Data Collection; 2014; Mitchell, N.
- Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th Edition; 2014; Dillman, D. A., Smyth, J. D., Christian, L. M.
- The survey playbook: how to create the perfect survey. (Vol.1); 2014; Champagne, M. V.
- Do your own online surveys. DYI and self serve market research; 2014; Cary, N.
- The Influence of Answer Box Format on Response Behavior on List-Style Open-Ended Questions; 2014; Keusch, F.
- Nonprobability Web Surveys to Measure Sexual Behaviors and Attitudes in the General Population: A Comparison...; 2014; Erens, B.; Burkill, S.; Couper, M. P.; C., Clifton, S., Tanton, C., Phelps, A., Datta, J., Mercer,...
- Luteal-phase support in assisted reproduction treatment: real-life practices reported worldwide by an...; 2014; Vaisbuch, E., de Ziegler, D., Leong, M., Shoham, Z., Weissman, A.
- Facebook, Twitter, & Qr Codes: An Exploratory Trial Examining The Feasibility Of Social Media Mechanisms...; 2014; Gu, L. L.
- Time-dependent variation in the responses to the web-based ISAAC questionnaire; 2014; Yoshida, K., Sasaki, M., Odajima, H., Itazawa, T., Hashimoto, K., Furukawa, M., Adachi, Y.